
BoostingTree: parallel selection of weak learners in boosting, with application to ranking



Abstract

Boosting algorithms have been found successful in many areas of machine learning and, in particular, in ranking. For typical classes of weak learners used in boosting (such as decision stumps or trees), a large feature space can slow down training, while a long sequence of weak hypotheses combined by boosting can result in a computationally expensive model. In this paper we propose a strategy that builds several sequences of weak hypotheses in parallel and extends the ones that are likely to yield a good model. The weak hypothesis sequences are arranged in a boosting tree, and new weak hypotheses are added to promising nodes (both leaves and inner nodes) of the tree using a randomized method. Theoretical results show that the proposed algorithm asymptotically achieves the performance of the base boosting algorithm it is applied to. Experiments on ranking web documents and on move ordering in chess indicate that the new strategy yields better performance when the length of the sequence is limited, and otherwise converges to performance similar to that of the original boosting algorithms. © 2013 The Author(s).
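The abstract describes the mechanism only at a high level. The Python sketch below is one plausible reading of it, not the authors' algorithm: the decision-stump weak learner, the squared-loss residual fitting, and the softmax-over-validation-loss rule for picking a "promising" node are all illustrative assumptions. The key point it captures is that every node of the tree holds a prefix sequence of weak hypotheses, and extending an already-extended (inner) node creates a new branch, so several boosting sequences grow in parallel.

```python
# Illustrative sketch of a boosting tree; details are assumptions, not the paper's method.
import numpy as np


class Stump:
    """Depth-1 regression tree (decision stump) used as the weak learner."""

    def fit(self, X, r):
        best = (np.inf, 0, 0.0, r.mean(), r.mean())  # fallback: constant prediction
        for j in range(X.shape[1]):
            for t in np.unique(X[:, j]):
                left = X[:, j] <= t
                if left.all() or not left.any():
                    continue
                lv, rv = r[left].mean(), r[~left].mean()
                err = ((r - np.where(left, lv, rv)) ** 2).sum()
                if err < best[0]:
                    best = (err, j, t, lv, rv)
        _, self.j, self.t, self.lv, self.rv = best
        return self

    def predict(self, X):
        return np.where(X[:, self.j] <= self.t, self.lv, self.rv)


class Node:
    """One node of the boosting tree: a prefix sequence of weak hypotheses."""

    def __init__(self, stumps, parent=None):
        self.stumps, self.parent, self.children = stumps, parent, []

    def predict(self, X, lr):
        pred = np.zeros(len(X))
        for s in self.stumps:
            pred += lr * s.predict(X)
        return pred


def boosting_tree(X, y, X_val, y_val, iters=50, lr=0.1, temperature=1.0, seed=0):
    rng = np.random.default_rng(seed)
    root = Node([])
    nodes = [root]
    for _ in range(iters):
        # Randomized selection of a promising node: lower validation loss gets a
        # higher probability of being extended. Both leaves and inner nodes stay
        # in `nodes`, so re-selecting an inner node branches the tree.
        losses = np.array([((y_val - n.predict(X_val, lr)) ** 2).mean() for n in nodes])
        p = np.exp(-(losses - losses.min()) / temperature)
        node = nodes[rng.choice(len(nodes), p=p / p.sum())]
        # Extend the chosen sequence with one more weak hypothesis fitted to the
        # residuals of its current prediction (squared-loss gradient boosting step).
        residual = y - node.predict(X, lr)
        child = Node(node.stumps + [Stump().fit(X, residual)], parent=node)
        node.children.append(child)
        nodes.append(child)
    # Return the hypothesis sequence with the best validation loss found so far.
    return min(nodes, key=lambda n: ((y_val - n.predict(X_val, lr)) ** 2).mean())
```

Under these assumptions, `boosting_tree(X_train, y_train, X_val, y_val)` returns the node whose weak-hypothesis sequence has the lowest validation error among all branches grown, which mirrors the abstract's claim that the strategy favors short sequences when the budget is limited and otherwise approaches the behavior of ordinary boosting.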
